
    The Evolution of E-Inclusion: Technology in Education for the Vision-Impaired

    The 1970s and 1980s saw a rapid take-up of personal computers. During the same period, society began to move towards providing equity for people with disabilities. As legislators around the world created new disability and Information Technology policies, more people with disabilities were given access to education, and the evolving computing tools provided unprecedented educational opportunities. These opportunities stemmed from new technologies such as the output of electronic text to voice synthesizers. The provision of assistive technology was not only helpful; it also delivered education through a medium that was previously unavailable, particularly to the blind and vision impaired. For much of the 1980s the development of text-processing sensory technologies, connected to personal computers, led to closer equality between the educational services available to the able-bodied and to people with disabilities. Unfortunately this evolution was not without notable difficulties: issues surrounding the cost of products, the lack of support from large corporations and the choice of platform resulted in substantial difficulties for educators in assessing appropriate technology. In addition, many of these products became largely redundant in the late 1980s as corporations began to place more emphasis on the Graphical User Interface (GUI). Although the GUI was remarkably successful in giving the general public better access to personal computing, its non-text nature once again created a digital divide for people with disabilities. While it is clear that the evolution of the personal computer has had a significant impact on the provision of education for people with disabilities, this paper highlights the historical pattern in which innovation is prioritized above e-inclusion.

    Slice sampling covariance hyperparameters of latent Gaussian models

    The Gaussian process (GP) is a popular way to specify dependencies between random variables in a probabilistic model. In the Bayesian framework the covariance structure can be specified using unknown hyperparameters. Integrating over these hyperparameters considers different possible explanations for the data when making predictions. This integration is often performed using Markov chain Monte Carlo (MCMC) sampling. However, with non-Gaussian observations standard hyperparameter sampling approaches require careful tuning and may converge slowly. In this paper we present a slice sampling approach that requires little tuning while mixing well in both strong- and weak-data regimes.
    Comment: 9 pages, 4 figures, 4 algorithms. Minor corrections to previous version. This version to appear in Advances in Neural Information Processing Systems (NIPS) 23, 2010.
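    As background, the sketch below shows the basic univariate slice-sampling update (stepping out and shrinkage) applied to a single lengthscale hyperparameter of a GP regression model with Gaussian noise, where the log marginal likelihood is tractable and a flat prior on the log-lengthscale is assumed. This is the simpler setting, not the paper's reparameterised samplers for non-Gaussian observations, and all names, kernels and settings are illustrative.

    import numpy as np

    def log_marginal_likelihood(log_ell, X, y, noise=0.1):
        # GP log marginal likelihood (up to a constant) with a squared-exponential
        # kernel; a single lengthscale hyperparameter is assumed for illustration.
        ell = np.exp(log_ell)
        sq_dists = (X[:, None] - X[None, :]) ** 2
        K = np.exp(-0.5 * sq_dists / ell**2) + noise**2 * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return -0.5 * y @ alpha - np.log(np.diag(L)).sum()

    def slice_sample_1d(x0, logp, width=1.0, rng=None):
        # One univariate slice-sampling update with stepping out and shrinkage.
        rng = rng or np.random.default_rng()
        log_y = logp(x0) + np.log(rng.uniform())     # auxiliary slice height
        lo = x0 - width * rng.uniform()              # randomly placed bracket
        hi = lo + width
        while logp(lo) > log_y:                      # step the bracket out
            lo -= width
        while logp(hi) > log_y:
            hi += width
        while True:                                  # sample, shrinking on rejection
            prop = rng.uniform(lo, hi)
            if logp(prop) > log_y:
                return prop
            if prop < x0:
                lo = prop
            else:
                hi = prop

    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 5.0, 30)
    y = np.sin(X) + 0.1 * rng.standard_normal(X.size)
    logp = lambda v: log_marginal_likelihood(v, X, y)    # flat prior on log-lengthscale
    log_ell, samples = 0.0, []
    for _ in range(200):
        log_ell = slice_sample_1d(log_ell, logp, rng=rng)
        samples.append(np.exp(log_ell))                  # posterior lengthscale samples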

    Driving Markov chain Monte Carlo with a dependent random stream

    Markov chain Monte Carlo is a widely used technique for generating a dependent sequence of samples from complex distributions. Conventionally, these methods require a source of independent random variates. Most implementations use pseudo-random numbers instead because generating truly independent variates with a physical system is not straightforward. In this paper we show how to modify some commonly used Markov chains to use a dependent stream of random numbers in place of independent uniform variates. The resulting Markov chains have the correct invariant distribution without requiring detailed knowledge of the stream's dependencies or even its marginal distribution. As a side effect, sometimes far fewer random numbers are required to obtain accurate results.
    Comment: 16 pages, 4 figures.
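    For orientation, the sketch below is an ordinary random-walk Metropolis sampler written so that its only source of randomness is a stream of Uniform(0,1) variates; it marks where an independent stream is conventionally assumed, which is the requirement the paper relaxes. It is not the paper's modified construction, and the target and proposal are illustrative.

    import numpy as np

    def metropolis(logp, x0, steps, stream):
        # Ordinary random-walk Metropolis with a symmetric uniform proposal.
        # `stream()` supplies Uniform(0,1) variates; conventionally these are
        # assumed independent -- the paper's point is that chains like this can
        # be modified so a dependent stream can be substituted here.
        x, lp = x0, logp(x0)
        samples = []
        for _ in range(steps):
            u_prop, u_acc = stream(), stream()   # the only randomness consumed
            prop = x + (u_prop - 0.5)            # symmetric proposal
            lp_prop = logp(prop)
            if np.log(u_acc) < lp_prop - lp:     # Metropolis accept/reject
                x, lp = prop, lp_prop
            samples.append(x)
        return np.array(samples)

    # Usage with an ordinary pseudo-random (i.i.d.) stream, targeting N(0, 1).
    rng = np.random.default_rng(0)
    chain = metropolis(lambda x: -0.5 * x * x, 0.0, 5000, rng.uniform)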

    Market Conditions and Retirement of Physical Capital: Evidence from Oil Tankers

    The endogeneity of capital retirements is studied for the particular case of oil tankers from 1979 to 1989. A model is estimated to examine the effect of changes in market conditions on the price and scrappage of tankers. Energy price rises had a major impact on the value of ships and on which ships were scrapped. A simple model is able to account for many features of the market. We use the information implicit in second-hand prices to ease the computational burden for the model that is estimated.

    Instructional eLearning technologies for the vision impaired

    Vision is the principal sensory modality employed in learning, which makes it difficult for vision-impaired students to access not only existing educational media but also the new, mostly visiocentric learning materials offered through on-line delivery mechanisms. Using the Cisco Certified Network Associate (CCNA) and IT Essentials courses as a reference, a study has been made of tools that can access such on-line systems and transcribe the materials into a form suitable for vision-impaired learning. The modalities employed included haptic, tactile, audio and descriptive text. The study demonstrates how such a multi-modal approach can achieve equivalent success for the vision impaired. However, it also shows the limits of the current understanding of human perception, especially with respect to comprehending two- and three-dimensional objects and spaces when there is no recourse to vision.

    Application of the Linear-Quadratic model to targeted radionuclide therapy

    The principal aim of this work was to test the hypothesis that the Linear-Quadratic (LQ) model of cell survival, developed for external beam radiotherapy (EBRT), could be extended to targeted radionuclide therapy (TRT) in order to predict dose-response relationships. The secondary aim was to establish the relevance of particular radiobiological phenomena to TRT and relate these results to any deviations from the response predicted by the LQ model. Methods: Cancer cell lines were treated with either EBRT or an in-vitro model of TRT. Dosimetry for the TRT was calculated using radiation transport simulations with the Monte Carlo PENELOPE code. Clonogenic as well as functional biological assays were used to assess cell response. Results: Accurate dosimetry for in-vitro exposures of cell cultures to radioactivity was established. LQ parameters of cell survival were established for cancer cell lines reported to be prone to apoptosis, low dose hypersensitivity (LDH) or the bystander effect. For apoptotic cells and cells exhibiting a bystander effect in response to EBRT, LQ parameters were found to be predictive of cell response to TRT. Apoptosis was not found to be a mode of cell death more specific to TRT than to EBRT. Bystander effects could not be demonstrated in cells exposed to TRT. Exposure to low doses of radiation may even protect against the bystander effect. The LQ model was not predictive of cell response in cells previously shown to exhibit LDH. This led to a development of the LQ model based upon a threshold dose rate for maximum repair. However, the current explanation of LDH may not explain the inverse dose-rate response. Conclusion: The LQ model of cell survival to radiation has been shown to be largely predictive of response to low dose-rate irradiation. However, in cells displaying LDH, further adaptation of the model was required.
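    For reference, the standard textbook form of the LQ cell-survival model and its usual generalisation to protracted, low dose-rate exposure (background material, not taken from this abstract) are

        S(D) = \exp\!\left[-\left(\alpha D + \beta D^{2}\right)\right],
        \qquad
        S(D, T) = \exp\!\left[-\left(\alpha D + G\,\beta D^{2}\right)\right],
        \qquad
        G = \frac{2\left(\mu T - 1 + e^{-\mu T}\right)}{(\mu T)^{2}},

    where D is the absorbed dose, T the duration of a constant dose-rate exposure, \mu the rate of sublethal-damage repair, and G the Lea-Catcheside dose-protraction factor (0 < G \le 1); as the exposure is protracted, G falls towards zero and the quadratic term contributes less to cell killing.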

    Incorporating Side Information in Probabilistic Matrix Factorization with Gaussian Processes

    Probabilistic matrix factorization (PMF) is a powerful method for modeling data associated with pairwise relationships, finding use in collaborative filtering, computational biology, and document analysis, among other areas. In many domains, there is additional information that can assist in prediction. For example, when modeling movie ratings, we might know when the rating occurred, where the user lives, or what actors appear in the movie. It is difficult, however, to incorporate this side information into the PMF model. We propose a framework for incorporating side information by coupling together multiple PMF problems via Gaussian process priors. We replace scalar latent features with functions that vary over the space of side information. The GP priors on these functions require them to vary smoothly and share information. We successfully use this new method to predict the scores of professional basketball games, where side information about the venue and date of the game is relevant for the outcome.
    Comment: 18 pages, 4 figures. Submitted to UAI 2010.
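    A minimal sketch of plain PMF fitted by batch gradient ascent on the log posterior, i.e. the baseline model the abstract extends; the GP-coupled version, which replaces each scalar latent feature with a function over the side information, is not shown. All names and hyperparameters are illustrative.

    import numpy as np

    def fit_pmf(R, mask, rank=5, lam=0.1, lr=0.01, epochs=500, seed=0):
        # Plain PMF: observed R[i, j] ~ N(u_i . v_j, sigma^2) with Gaussian priors
        # on the latent features; the L2 penalty `lam` corresponds to the ratio of
        # the noise variance to the prior variance of the features.
        rng = np.random.default_rng(seed)
        n, m = R.shape
        U = 0.1 * rng.standard_normal((n, rank))
        V = 0.1 * rng.standard_normal((m, rank))
        for _ in range(epochs):
            E = mask * (R - U @ V.T)          # residuals on observed entries only
            U += lr * (E @ V - lam * U)       # ascend the log posterior in U
            V += lr * (E.T @ U - lam * V)     # ... and in V
        return U, V

    # Toy usage: zeros in R mark missing ratings; predictions fill the whole matrix.
    R = np.array([[5.0, 3.0, 0.0],
                  [4.0, 0.0, 1.0],
                  [1.0, 1.0, 5.0]])
    mask = (R > 0).astype(float)
    U, V = fit_pmf(R, mask)
    predictions = U @ V.T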

    How biased are maximum entropy models?

    Maximum entropy models have become popular statistical models in neuroscience and other areas in biology, and can be useful tools for obtaining estimates of mutual information in biological systems. However, maximum entropy models fit to small data sets can be subject to sampling bias; i.e. the true entropy of the data can be severely underestimated. Here we study the sampling properties of estimates of the entropy obtained from maximum entropy models. We show that if the data is generated by a distribution that lies in the model class, the bias is equal to the number of parameters divided by twice the number of observations. However, in practice, the true distribution is usually outside the model class, and we show here that this misspecification can lead to much larger bias. We provide a perturbative approximation of the maximally expected bias when the true model is out of model class, and we illustrate our results using numerical simulations of an Ising model; i.e. the second-order maximum entropy distribution on binary data.
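    As a concrete illustration of the in-class result above (entropies in nats, assuming natural logarithms; the parameter count is for the pairwise Ising case mentioned in the abstract):

        \text{bias} \approx \frac{d}{2N},
        \qquad
        d = n + \binom{n}{2} \text{ for a second-order model on } n \text{ binary units},

    so with n = 10 units and N = 1000 observations, d = 10 + 45 = 55, and the entropy estimate from the fitted model falls below the true entropy by roughly 55/2000 \approx 0.03 nats; as the abstract notes, misspecification (the true distribution lying outside the model class) can make the bias considerably larger.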

    The Use of Programmed Learning Materials to Investigate Learning Processes in Difficult Areas in School Chemistry

    Abstract Not Provided